# Large-scale Merging
## Phalanx 512x460M MoE

License: Apache-2.0

Phalanx 512x460M MoE is a lightweight mixture-of-experts model with 512 experts built from LiteLlama-460M-1T, suitable for efficient inference and text generation tasks.

Tags: Large Language Model, Transformers, English

Author: Kquant03
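The key idea behind a mixture-of-experts model like this is that only a few of the many experts are active per input, which keeps inference cheap even with 512 experts. The sketch below is a generic, hypothetical illustration of top-k expert routing (not Phalanx's actual implementation); all names, shapes, and the small expert count are illustrative assumptions.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Hypothetical sketch of top-k mixture-of-experts routing.

    x: (d,) input vector; gate_w: (d, n_experts) gating weights;
    experts: list of callables, one per expert.
    """
    logits = x @ gate_w                    # gating score per expert
    top = np.argsort(logits)[-top_k:]      # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts
    # Only the selected experts run; the rest are skipped entirely,
    # which is why inference cost stays low despite many experts.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 16                       # small stand-in for 512 experts
gate_w = rng.normal(size=(d, n_experts))
experts = [lambda x, W=rng.normal(size=(d, d)): W @ x
           for _ in range(n_experts)]
y = moe_forward(rng.normal(size=d), gate_w, experts)
print(y.shape)
```

Here each expert is a toy linear map; in a real MoE transformer the experts are feed-forward sublayers and routing happens per token, but the gating and sparse combination follow the same pattern.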
© 2025 AIbase